
    Percolation on two- and three-dimensional lattices

    In this work we apply a highly efficient Monte Carlo algorithm recently proposed by Newman and Ziff to treat percolation problems. Site and bond percolation are studied on a number of lattices in two and three dimensions. Quite good results for the wrapping probabilities, the correlation length critical exponent and the critical concentration are obtained for the square, simple cubic, HCP and hexagonal lattices using relatively small systems. We also confirm the universal aspect of the wrapping probabilities with regard to site and bond dilution. Comment: 15 pages, 6 figures, 3 tables
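The Newman–Ziff algorithm mentioned above adds sites (or bonds) one at a time in random order and maintains the growing clusters with a union-find structure, so a single sweep covers the entire range of occupation probabilities. A minimal sketch for site percolation on an L×L square lattice follows; this is a toy free-boundary version with a left-to-right spanning criterion, and the function name and details are illustrative, not the paper's code:

```python
import random

def newman_ziff_site(L, seed=0):
    """Toy Newman-Ziff sweep: occupy sites of an L x L square lattice in
    random order and return the occupation fraction at which a cluster
    first spans from the left edge to the right edge (free boundaries)."""
    rng = random.Random(seed)
    n = L * L
    parent = list(range(n))
    touches_left = [False] * n   # does this root's cluster touch column 0?
    touches_right = [False] * n  # does it touch column L-1?

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    occupied = [False] * n
    order = list(range(n))
    rng.shuffle(order)
    for k, s in enumerate(order, start=1):
        occupied[s] = True
        r = find(s)
        touches_left[r] |= (s % L == 0)
        touches_right[r] |= (s % L == L - 1)
        x, y = s % L, s // L
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < L and 0 <= ny < L and occupied[ny * L + nx]:
                ra, rb = find(s), find(ny * L + nx)
                if ra != rb:
                    # Merge clusters and propagate the edge flags.
                    parent[rb] = ra
                    touches_left[ra] |= touches_left[rb]
                    touches_right[ra] |= touches_right[rb]
        r = find(s)
        if touches_left[r] and touches_right[r]:
            return k / n
    return 1.0
```

Averaging the spanning fraction over many random insertion orders gives an estimate close to the known square-lattice site threshold p_c ≈ 0.5927, with finite-size fluctuations shrinking as L grows.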

    Percolation model for structural phase transitions in Li_{1-x}H_xIO_3 mixed crystals

    A percolation model is proposed to explain the structural phase transitions found in Li_{1-x}H_xIO_3 mixed crystals as a function of the concentration parameter x. The percolation thresholds are obtained from Monte Carlo simulations on the specific lattices occupied by lithium atoms and hydrogen bonds. The theoretical results strongly suggest that percolating lithium vacancies and hydrogen bonds are indeed responsible for the solid solution observed in the experimental range 0.22 < x < 0.36. Comment: 4 pages, 2 figures

    Propagation of nuclear data uncertainty: Exact or with covariances

    Two distinct methods for propagating basic nuclear data uncertainties to large-scale systems are presented and compared. The "Total Monte Carlo" method uses a statistical ensemble of nuclear data libraries randomly generated by means of a Monte Carlo approach with the TALYS system. These libraries are then used directly in a large number of reactor calculations (for instance with MCNP), after which the full probability distribution of the reactor parameter is obtained. The second method makes use of available covariance files and requires only a single reactor calculation (using the perturbation method). In this exercise, both methods use consistent sets of data files: the covariance files used in the second method are obtained directly from the randomly generated nuclear data libraries of the first method. This makes for a unique and straightforward comparison that directly exposes the advantages and drawbacks of each method. Comparisons for different reactions and criticality-safety benchmarks from 19F to the actinides are presented, allowing us to conclude whether current methods for using covariance data are adequate.
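The contrast between the two routes can be illustrated on a toy problem. Here a made-up smooth function stands in for a full reactor calculation, and all names and numbers are illustrative: Total Monte Carlo samples the inputs and reruns the model for every sample, while the covariance route builds a covariance matrix from the same ensemble (as in the abstract) and propagates it with first-order sensitivities, the "sandwich rule".

```python
import random
import statistics

def model(x1, x2):
    """Toy 'reactor parameter' as a function of two nuclear-data inputs
    (hypothetical stand-in for a full transport calculation)."""
    return 1.0 + 0.3 * x1 - 0.2 * x2 + 0.05 * x1 * x2

rng = random.Random(42)

# Total Monte Carlo: a random 'library' ensemble, one model run per sample,
# yielding the full output distribution.
ens = [(rng.gauss(1.0, 0.05), rng.gauss(1.0, 0.08)) for _ in range(20000)]
outputs = [model(a, b) for a, b in ens]
tmc_std = statistics.stdev(outputs)

# Covariance route: build the input covariance from the same ensemble and
# propagate it with central-difference sensitivities (sandwich rule).
m1 = statistics.mean(a for a, _ in ens)
m2 = statistics.mean(b for _, b in ens)
v1 = statistics.variance([a for a, _ in ens])
v2 = statistics.variance([b for _, b in ens])
c12 = sum((a - m1) * (b - m2) for a, b in ens) / (len(ens) - 1)
h = 1e-6
s1 = (model(m1 + h, m2) - model(m1 - h, m2)) / (2 * h)
s2 = (model(m1, m2 + h) - model(m1, m2 - h)) / (2 * h)
cov_std = (s1 * s1 * v1 + 2 * s1 * s2 * c12 + s2 * s2 * v2) ** 0.5

print(f"TMC std: {tmc_std:.4f}, sandwich std: {cov_std:.4f}")
```

For a nearly linear model the two estimates agree closely; the Monte Carlo route additionally delivers the full (possibly non-Gaussian) output distribution, at the cost of many model runs, which is the trade-off the abstract examines.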

    INTRODUCTION OF OSCAR-4 AT THE HIGH FLUX REACTOR (PETTEN)

    Since 2005, the nodal-diffusion-based code system OSCAR-3 has been used for reactor support calculations of operational cycles of the High Flux Reactor in Petten, The Netherlands. OSCAR uses a two-step deterministic calculation, in which homogenized cross sections are generated in lattice environments using neutron transport simulations and then passed to a nodal diffusion core simulator that models the full reactor. Limitations in OSCAR-3 led to the need for improved modelling capabilities and better physics models for components present in the reactor core. OSCAR-4 improves on OSCAR-3 in its approach to homogenization, and the new version of the diffusion core simulator allows for better modelling of movable components such as control rods. Fuel inventories calculated with OSCAR-4 can also easily be exported to MCNP, which allows the calculation of, among other quantities, individual plate powers and local reaction rates. For these reasons, OSCAR-4 is currently being introduced as a core support tool at the High Flux Reactor. In this work, the steps followed to validate the reactor models are presented, including results of validation calculations from both OSCAR-4 and MCNP6 over multiple reactor cycles. In addition, differences between cross section library evaluations and their impact on the results are presented for the MCNP model.

    The TENDL library: Hope, reality and future

    The TALYS Evaluated Nuclear Data Library (TENDL) has had eight releases since 2008. Considerable experience has been acquired in the production of such a general-purpose nuclear data library, based on feedback from users, evaluators and processing experts. The backbone of this achievement is simple and robust: completeness, quality and reproducibility. As TENDL is extensively used in many fields of application, it is necessary to understand its strong points and remaining weaknesses. Ultimately, the essential knowledge is not the TENDL library itself but rather the underlying methods and tools, making the library a side product and focusing the effort on evaluation knowledge. The future of this approach is discussed, with the hope of greater success in the near future.
